Revisiting L2,1-Norm Robustness With Vector Outlier Regularization
Authors
Abstract
Similar Resources
Outlier Regularization for Vector Data and L21 Norm Robustness
In many real-world applications, data usually contain outliers. One popular approach is to use the L2,1-norm function as a robust error/loss function. However, the robustness of the L2,1-norm function is not well understood so far. In this paper, we propose a new Vector Outlier Regularization (VOR) framework to understand and analyze the robustness of the L2,1-norm function. VOR defines a data point to be an outlier if it is outside a threshold with re...
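The thresholding idea in the abstract can be illustrated with a minimal sketch: a point is flagged as an outlier when its distance from a reference vector exceeds a threshold. The function names, the reference vector, and the specific distance rule below are illustrative assumptions, not the paper's formulation.

```python
import math

def flag_outliers(X, center, theta):
    """Flag a row as an outlier when its Euclidean distance to a
    reference vector exceeds the threshold theta (illustrative
    VOR-style rule; not the paper's exact definition)."""
    dists = [math.sqrt(sum((a - b) ** 2 for a, b in zip(row, center)))
             for row in X]
    return [d > theta for d in dists]

X = [[0.1, 0.0], [0.2, -0.1], [5.0, 5.0]]   # last point is a gross outlier
center = [0.0, 0.0]
print(flag_outliers(X, center, 1.0))        # → [False, False, True]
```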
L1-norm Error Function Robustness and Outlier Regularization
In many real-world applications, data come with corruptions, large errors, or outliers. One popular approach is to use the L1-norm function. However, the robustness of the L1-norm function is not well understood so far. In this paper, we present a new outlier regularization framework to understand and analyze the robustness of the L1-norm function. There are two main features of the proposed outlier regularizati...
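The robustness contrast between L1 and L2 errors is easiest to see in one dimension: the minimizer of the squared (L2) error is the mean, while the minimizer of the absolute (L1) error is the median, which a single gross outlier barely moves. A minimal illustrative sketch (not from the paper):

```python
from statistics import mean, median

clean = [1.0, 1.1, 0.9, 1.05, 0.95]
dirty = clean + [100.0]   # one gross outlier

# sum((x - c)^2) is minimized by the mean; sum(|x - c|) by the median.
print(mean(clean), mean(dirty))      # mean jumps from 1.0 to 17.5
print(median(clean), median(dirty))  # median moves only from 1.0 to 1.025
```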
Robustness and Regularization of Support Vector Machines
We consider regularized support vector machines (SVMs) and show that they are precisely equivalent to a new robust optimization formulation. We show that this equivalence of robust optimization and regularization has implications for both algorithms and analysis. In terms of algorithms, the equivalence suggests more general SVM-like algorithms for classification that explicitly build in protec...
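The regularized SVM objective this abstract refers to is the hinge loss plus an L2 penalty, min_w (1/n) Σ max(0, 1 − y_i ⟨w, x_i⟩) + λ‖w‖². A minimal subgradient-descent sketch (illustrative only, with an assumed step size and no bias term; not the paper's algorithm):

```python
def train_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Subgradient descent on the regularized hinge loss:
       (1/n) * sum(max(0, 1 - y_i * <w, x_i>)) + lam * ||w||^2
    (illustrative linear-SVM sketch, no bias term)."""
    d, n = len(X[0]), len(X)
    w = [0.0] * d
    for _ in range(epochs):
        grad = [2 * lam * wj for wj in w]           # gradient of the L2 regularizer
        for xi, yi in zip(X, y):
            margin = yi * sum(wj * xj for wj, xj in zip(w, xi))
            if margin < 1:                          # hinge subgradient: -y_i * x_i
                for j in range(d):
                    grad[j] -= yi * xi[j] / n
        w = [wj - lr * gj for wj, gj in zip(w, grad)]
    return w

# Linearly separable toy data in 2-D.
X = [[2.0, 2.0], [3.0, 1.5], [-2.0, -1.0], [-1.5, -2.5]]
y = [1, 1, -1, -1]
w = train_svm(X, y)
preds = [1 if sum(wj * xj for wj, xj in zip(w, xi)) >= 0 else -1 for xi in X]
print(preds)   # matches y on this separable toy set
```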
Estimation with Norm Regularization
Analysis of non-asymptotic estimation error and structured statistical recovery based on norm regularized regression, such as Lasso, needs to consider four aspects: the norm, the loss function, the design matrix, and the noise model. This paper presents generalizations of such estimation error analysis on all four aspects compared to the existing literature. We characterize the restricted error...
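Norm-regularized regression of the kind analyzed here can be sketched with the Lasso (L1-regularized least squares), solved by ISTA: a gradient step on the smooth loss followed by coordinate-wise soft-thresholding. The toy problem and step-size choice below are illustrative assumptions, not the paper's estimator.

```python
def soft_threshold(v, t):
    """Proximal operator of t*|.|: shrink v toward zero by t, clipping at 0."""
    return max(v - t, 0.0) if v >= 0 else min(v + t, 0.0)

def lasso_ista(X, y, lam=0.1, iters=500):
    """ISTA for min_w 0.5 * ||X w - y||^2 + lam * ||w||_1.
    Illustrative pure-Python sketch; X is a list of rows."""
    n, d = len(X), len(X[0])
    # Safe step size: 1 / ||X||_F^2 <= 1 / L, where L (the spectral norm
    # squared) is the Lipschitz constant of the smooth part's gradient.
    step = 1.0 / sum(v * v for row in X for v in row)
    w = [0.0] * d
    for _ in range(iters):
        r = [sum(wj * xj for wj, xj in zip(w, row)) - yi
             for row, yi in zip(X, y)]                                  # X w - y
        g = [sum(X[i][j] * r[i] for i in range(n)) for j in range(d)]   # X^T r
        w = [soft_threshold(wj - step * gj, step * lam)
             for wj, gj in zip(w, g)]
    return w

X = [[1.0, 0.0], [0.0, 1.0]]       # identity design: closed form is soft_threshold(y_i, lam)
y = [1.0, 0.05]
print(lasso_ista(X, y, lam=0.1))   # ≈ [0.9, 0.0]: the small coefficient is zeroed out
```

With an identity design matrix the Lasso has the closed form w_i = soft_threshold(y_i, λ), so the sparsifying effect of the L1 norm is visible directly: the coefficient below the threshold is driven exactly to zero.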
Journal
Journal title: IEEE Transactions on Neural Networks and Learning Systems
Year: 2020
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/tnnls.2020.2964297